Minimax estimator
In statistical decision theory, where we are faced with the problem of estimating a deterministic parameter (vector) $\theta \in \Theta$ from observations $x \in \mathcal{X}$, an estimator (estimation rule) $\delta^M$ is called minimax if its maximal risk is minimal among all estimators of $\theta$. In a sense this means that $\delta^M$ is the estimator which performs best in the worst case allowed in the problem.

== Problem setup ==

Consider the problem of estimating a deterministic (not Bayesian) parameter $\theta \in \Theta$ from noisy or corrupt data $x \in \mathcal{X}$, related to $\theta$ through the conditional probability distribution $P(x \mid \theta)$. Our goal is to find a "good" estimator $\delta(x)$ of the parameter $\theta$, one that minimizes some given risk function $R(\theta, \delta)$. Here the risk function is the expectation of some loss function $L(\theta, \delta)$ with respect to $P(x \mid \theta)$:

$$R(\theta, \delta) = \operatorname{E}\bigl[\, L(\theta, \delta(x)) \,\bigr].$$

A popular example of a loss function is the squared error loss $L(\theta, \delta) = \lVert \theta - \delta \rVert^2$; the risk function for this loss is the mean squared error (MSE).

Unfortunately, the risk cannot in general be minimized directly, since it depends on the unknown parameter $\theta$ itself (if we knew the actual value of $\theta$, we would not need to estimate it). Additional criteria for defining an optimal estimator in some sense are therefore required. One such criterion is the minimax criterion.
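The criterion can be stated formally; the display below is the standard formulation rather than text from this excerpt. An estimator $\delta^M$ is minimax with respect to the risk $R(\theta, \delta)$ if it attains the smallest worst-case risk among all estimators:

$$\sup_{\theta \in \Theta} R\bigl(\theta, \delta^M\bigr) \;=\; \inf_{\delta} \, \sup_{\theta \in \Theta} R(\theta, \delta).$$

That is, one first evaluates each estimator by its maximal risk over all admissible $\theta$, and then prefers the estimator for which this maximum is smallest.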
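A small numerical sketch may make the criterion concrete. The Python snippet below (assuming NumPy is available) uses a textbook example not contained in the excerpt above: estimating a binomial proportion $p$ from $x \sim \mathrm{Bin}(n, p)$ under squared error loss, comparing the MLE $x/n$ against the estimator $\delta(x) = (x + \sqrt{n}/2)/(n + \sqrt{n})$, which is known to have constant risk and to be minimax for this problem.

```python
import numpy as np

# Worst-case (maximal) risk comparison for estimating a binomial
# proportion p from x ~ Binomial(n, p) under squared error loss.

def risk_mle(p, n):
    """MSE of the MLE x/n. It is unbiased, so its risk equals its variance."""
    return p * (1 - p) / n

def risk_minimax(p, n):
    """MSE of delta(x) = (x + sqrt(n)/2) / (n + sqrt(n)).

    Bias and variance are available in closed form:
      E[delta]   = (n*p + sqrt(n)/2) / (n + sqrt(n))
      Var[delta] = n*p*(1 - p) / (n + sqrt(n))**2
    """
    s = np.sqrt(n)
    bias = (n * p + s / 2) / (n + s) - p
    var = n * p * (1 - p) / (n + s) ** 2
    return bias**2 + var

n = 25
p_grid = np.linspace(0.0, 1.0, 1001)

# The MLE has smaller risk near p = 0 and p = 1, but its worst-case risk,
# 1/(4n) at p = 1/2, exceeds the constant risk 1/(4*(1 + sqrt(n))**2) of
# the second estimator, so the latter wins under the minimax criterion.
print("max risk, MLE:    ", risk_mle(p_grid, n).max())
print("max risk, minimax:", risk_minimax(p_grid, n).max())
```

For $n = 25$ this prints a worst-case risk of $0.01$ for the MLE versus roughly $0.0069$ for the constant-risk estimator, illustrating how an estimator that is worse for most values of $\theta$ can still be preferred under the minimax criterion.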